Contrastive Hebbian Learning in the Continuous Hopfield Model

Author

  • Javier R. Movellan
Abstract

This paper shows that contrastive Hebbian learning, the algorithm used in mean field learning, can be applied to any continuous Hopfield model. This implies that non-logistic activation functions as well as self-connections are allowed. Contrary to previous approaches, the learning algorithm is derived without considering it a mean field approximation to Boltzmann machine learning. The paper includes a discussion of the conditions under which the function that contrastive Hebbian learning minimizes can be considered a proper error function, and an analysis of five different training regimes. An appendix provides complete derivations and specific instructions on how to implement contrastive Hebbian learning in interactive activation and competition models (a convenient version of the continuous Hopfield model).
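The two-phase procedure behind contrastive Hebbian learning can be sketched in a few lines. The following toy example (the two-unit network, logistic activations, learning rate, and task are illustrative assumptions, not details taken from the paper) clamps only the input in a free phase, clamps both input and output in a clamped phase, and updates each weight by the difference of the two Hebbian co-activation products:

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def settle(W, b, x, clamped, steps=100, dt=0.2):
    """Relax the free units toward the fixed point x_i = sigmoid((W x + b)_i)."""
    for _ in range(steps):
        x = np.where(clamped, x, x + dt * (sigmoid(W @ x + b) - x))
    return x

def chl_step(W, b, inp, tgt, lr=0.5):
    """One contrastive Hebbian update: unit 0 is the input, unit 1 the output."""
    x0 = np.array([inp, 0.5])
    # Free (minus) phase: only the input is clamped.
    x_minus = settle(W, b, x0, np.array([True, False]))
    # Clamped (plus) phase: input and output are both clamped.
    x_plus = settle(W, b, np.array([inp, tgt]), np.array([True, True]))
    dW = lr * (np.outer(x_plus, x_plus) - np.outer(x_minus, x_minus))
    np.fill_diagonal(dW, 0.0)  # the paper allows self-connections; zeroed here for simplicity
    W += dW
    b += lr * (x_plus - x_minus)
    return W, b

# Toy association: input 1 -> output 1, input 0 -> output 0.
W = np.zeros((2, 2))
b = np.zeros(2)
for _ in range(300):
    for inp, tgt in [(1.0, 1.0), (0.0, 0.0)]:
        W, b = chl_step(W, b, inp, tgt)
```

Because both phases share the same settling dynamics, the update vanishes exactly when the free-phase fixed point already matches the clamped one, which is what drives the network's free output toward the targets.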

Related articles

An Architecture for the Learning of Perceptually Grounded Word Meanings

In this statement, we discuss two kinds of properties that a grounded model of the learning of word meaning should have, those related to the way in which linguistic and non-linguistic processing should interact and those related to the representational demands placed on such a model. We also introduce Playpen (Gasser & Colunga 1997) a neural network architecture with these properties. Playpen ...

Using Contrastive Hebbian Learning to Model Early Auditory Processing

We present a model of early auditory processing using the Symmetric Diffusion Network (SDN) architecture, a class of multi-layer, parallel distributed processing model based on the principles of continuous, stochastic, adaptive, and interactive processing [Movellan & McClelland, 1993]. From a computational perspective, a SDN can be viewed as a continuous version of the Boltzmann machine; that i...
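The SDN's exact dynamics are given in Movellan & McClelland (1993) and are not reproduced here; as a generic illustration of what "continuous, stochastic, interactive" processing means, a Langevin-style noisy settling step might look like the following sketch (the drift toward a logistic fixed point, the temperature parameter, and the step size are assumptions for illustration):

```python
import numpy as np

def noisy_settle(W, b, x, steps=500, dt=0.05, temp=0.1, rng=None):
    """Generic Langevin-style relaxation: deterministic drift toward
    sigmoid(W x + b) plus Gaussian noise scaled by a temperature."""
    if rng is None:
        rng = np.random.default_rng(0)
    for _ in range(steps):
        drift = 1.0 / (1.0 + np.exp(-(W @ x + b))) - x
        x = x + dt * drift + np.sqrt(2.0 * temp * dt) * rng.standard_normal(len(x))
    return x
```

At temperature zero this reduces to deterministic continuous Hopfield settling; with noise, the state samples around the fixed point instead of sitting on it, which is the sense in which such networks behave like continuous Boltzmann machines.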

High Performance Associative Memories and Structured Weight Dilution

The consequences of two techniques for symmetrically diluting the weights of the standard Hopfield architecture associative memory model, trained using a non-Hebbian learning rule, are examined. This paper reports experimental investigations into the effect of dilution on factors such as: pattern stability and attractor performance. It is concluded that these networks maintain a reasonable leve...
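The paper's structured dilution schemes are not reproduced here, but symmetric dilution itself is simple to state: connections are removed in pairs so that the weight matrix stays symmetric. A minimal sketch of random symmetric dilution (the dilution fraction and matrix size are illustrative assumptions):

```python
import numpy as np

def dilute_symmetric(W, frac, rng):
    """Zero a fraction of weight pairs (i, j) and (j, i) together,
    preserving the symmetry of W."""
    N = W.shape[0]
    iu, ju = np.triu_indices(N, k=1)
    keep = rng.random(len(iu)) >= frac        # keep each pair with prob 1 - frac
    mask = np.zeros_like(W)
    mask[iu, ju] = keep
    mask = mask + mask.T                      # mirror the kept pairs below the diagonal
    np.fill_diagonal(mask, 1.0)
    return W * mask

rng = np.random.default_rng(1)
W = rng.standard_normal((100, 100))
W = (W + W.T) / 2                             # a symmetric weight matrix to dilute
Wd = dilute_symmetric(W, 0.3, rng)
```

Diluting in symmetric pairs matters because Hopfield-style energy arguments (and hence guaranteed convergence to fixed points) rely on W remaining symmetric.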

Performance Analysis of Hopfield Model of Neural Network with Evolutionary Approach for Pattern Recalling

ABSTRACT In the present paper, an effort has been made to compare and analyze the performance for pattern recalling with conventional hebbian learning rule and with evolutionary algorithm in Hopfield Model of feedback Neural Networks. A set of ten objects has been considered as the pattern set. In the Hopfield type of neural networks of associative memory, the weighted code of input patterns pr...
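For reference, the conventional Hebbian storage-and-recall scheme that such comparisons start from can be sketched as follows (the pattern size, number of patterns, and corruption level are illustrative assumptions):

```python
import numpy as np

def hebbian_store(patterns):
    """Hebbian (outer-product) rule: W = (1/N) sum_p xi_p xi_p^T, no self-connections."""
    P, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, sweeps=5):
    """Asynchronous threshold updates; converge to a fixed point for symmetric W."""
    x = x.copy()
    for _ in range(sweeps):
        for i in range(len(x)):
            x[i] = 1 if W[i] @ x >= 0 else -1
    return x

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 50))  # two random +/-1 patterns, N = 50
W = hebbian_store(patterns)

probe = patterns[0].copy()
probe[:5] *= -1                               # corrupt 5 of the 50 bits
recovered = recall(W, probe)
```

With only two stored patterns this is far below the roughly 0.138 N capacity of the Hebbian rule, so the corrupted probe falls back into the stored attractor.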

Correlated sequence learning in a network of spiking neurons using maximum likelihood

Hopfield Networks are an idealised model of distributed computation in networks of non-linear, stochastic units. We consider the learning of correlated temporal sequences using Maximum Likelihood, deriving a simple Hebbian-like learning rule that is capable of robustly storing multiple sequences of correlated patterns. We argue that the learning rule is optimal for the case of long temporal seq...
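The paper's maximum-likelihood rule is not reproduced here; for contrast, the classical asymmetric Hebbian sequence rule (which works for uncorrelated patterns and is the baseline such rules improve on) can be sketched as follows, with pattern size and sequence length as illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
N, L = 100, 4
seq = rng.choice([-1, 1], size=(L, N))        # a sequence of 4 random +/-1 patterns

# Classical asymmetric rule: W = (1/N) sum_t xi_{t+1} xi_t^T,
# so each pattern's attractor points at the next pattern in the sequence.
W = sum(np.outer(seq[t + 1], seq[t]) for t in range(L - 1)) / N

x = seq[0].copy()
recalled = [x]
for _ in range(L - 1):
    x = np.where(W @ x >= 0, 1, -1)           # one synchronous step advances the sequence
    recalled.append(x)
```

Because W is asymmetric, the usual energy-function argument does not apply; the dynamics transition through the stored sequence rather than settling into a single fixed point, and with correlated patterns the crosstalk terms grow, which is the regime the maximum-likelihood approach targets.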

Publication date: 1990